Relaxed Wasserstein with Applications to GANs
Authors
Abstract
We propose a novel class of statistical divergences called Relaxed Wasserstein (RW) divergences. RW divergences generalize the Wasserstein distance and are parametrized by strictly convex, differentiable functions. We establish several key probabilistic properties of RW that are critical to the success of Wasserstein distances. In particular, we show that RW is dominated by both Total Variation (TV) and the Wasserstein-L distance, and we establish continuity, differentiability, and a duality representation of the RW divergence. Finally, we provide a non-asymptotic moment estimate and a concentration inequality for the RW divergence. Our experiments on image generation problems show that RWGANs with Kullback-Leibler (KL) divergence deliver performance competitive with many state-of-the-art approaches. Empirically, RWGANs exhibit better convergence properties than WGANs, with competitive inception scores. In contrast to the existing GAN literature, which is ad hoc in its choice of cost functions, this conceptual framework not only provides great flexibility in designing general cost functions, e.g., for applications to GANs, but also allows different cost functions to be implemented and compared under a unified mathematical framework.
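To make the idea concrete, here is a minimal sketch of the RW construction in the discrete case: the ground cost between points is taken to be a Bregman divergence generated by a strictly convex, differentiable function φ, and the optimal-transport problem is solved over that cost. All names below are illustrative assumptions, not the paper's implementation; for two uniform empirical measures with equally many atoms, discrete OT reduces to an assignment problem, which SciPy solves exactly.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def bregman_cost(X, Y, phi, grad_phi):
    """Pairwise Bregman cost D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>.

    For a strictly convex phi, D_phi is nonnegative and zero iff x == y.
    """
    C = np.empty((len(X), len(Y)))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            C[i, j] = phi(x) - phi(y) - grad_phi(y) @ (x - y)
    return C

def relaxed_wasserstein(X, Y, phi, grad_phi):
    """Illustrative discrete RW divergence between two uniform empirical
    measures with the same number of atoms (OT becomes an assignment problem)."""
    C = bregman_cost(X, Y, phi, grad_phi)
    rows, cols = linear_sum_assignment(C)  # exact optimal matching
    return C[rows, cols].mean()

# phi(x) = sum_k x_k log x_k (negative entropy) generates a generalized
# KL-type cost -- the KL choice the abstract reports working well for RWGANs.
phi = lambda x: np.sum(x * np.log(x))
grad_phi = lambda x: np.log(x) + 1.0

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 1.5, size=(16, 3))  # strictly positive samples for the log
Y = rng.uniform(0.5, 1.5, size=(16, 3))

print(relaxed_wasserstein(X, Y, phi, grad_phi))  # some positive value
print(relaxed_wasserstein(X, X, phi, grad_phi))  # ~0: identical measures
```

Note that choosing φ(x) = ||x||²/2 makes the Bregman cost the squared Euclidean distance, recovering the usual Wasserstein-2 cost; swapping φ is exactly the flexibility in cost design the abstract emphasizes.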
Similar resources
Face Super-Resolution Through Wasserstein GANs
Generative adversarial networks (GANs) have received a tremendous amount of attention in the past few years and have inspired applications addressing a wide range of problems. Despite their great potential, GANs are difficult to train. Recently, a series of papers (Arjovsky & Bottou, 2017a; Arjovsky et al., 2017b; Gulrajani et al., 2017) proposed using the Wasserstein distance as the training obje...
Manifold-valued Image Generation with Wasserstein Adversarial Networks
Unsupervised image generation has recently received an increasing amount of attention thanks to the great success of generative adversarial networks (GANs), particularly Wasserstein GANs. Inspired by the paradigm of real-valued image generation, this paper makes the first attempt to formulate the problem of generating manifold-valued images, which are frequently encountered in real-world applic...
On reproduction of On the regularization of Wasserstein GANs
This report has several purposes. First, it investigates the reproducibility of the submitted paper On the regularization of Wasserstein GANs (2018). Second, among the experiments performed in the submitted paper, five aspects were emphasized and reproduced: learning speed, stability, robustness against hyperparameters, estimation of the Wasserstein distance, and various sampli...
Demystifying MMD GANs
We investigate the training and performance of generative adversarial networks using the Maximum Mean Discrepancy (MMD) as critic, termed MMD GANs. As our main theoretical contribution, we clarify the situation with bias in GAN loss functions raised by recent work: we show that gradient estimators used in the optimization process for both MMD GANs and Wasserstein GANs are unbiased, but learning...
Summable Reparameterizations of Wasserstein Critics in the One-Dimensional Setting
Generative adversarial networks (GANs) are an exciting alternative to algorithms for solving density estimation problems, i.e., using data to assess how likely samples are to be drawn from the same distribution. Instead of explicitly computing these probabilities, GANs learn a generator that can match the given probabilistic source. This paper looks particularly at this matching capability in the cont...
Journal: CoRR
Volume: abs/1705.07164
Pages: -
Year of publication: 2017